Content Workers in Africa Sue Facebook, Report Poor Work Conditions
2023-07-12
Nathan Nkuzimana worked for a Kenyan company called Sama until early 2023.
Sama is a technology company that sold content moderation services to Facebook.
Content moderators look at material posted to social media websites such as Facebook for quality control.
The 33-year-old Nkuzimana came to Kenya from Burundi to work for Sama.
He said he looked at violent and sexual images and videos every workday.
He said he saw a child being sexually violated in one post and a woman being killed in another.
Nkuzimana said seeing such horrors damaged his mental health.
Nkuzimana is not alone. He joined about 200 workers from Kenya who are part of a larger group in Africa suing Sama and Facebook.
They say they suffered through harmful working conditions.
It is the first known international legal case related to Facebook content moderators.
In 2020, Facebook settled a legal action against it brought by content workers in the U.S.
The Sama workers did their jobs at a Nairobi office.
They looked at material that came from Facebook users in Africa.
Their job was to remove harmful or illegal posts.
The workers in Africa are seeking $1.6 billion in compensation for their work.
They say their employers did not pay them enough or provide enough mental health support.
They also say Sama and Facebook should continue to pay them while courts consider the case.
Facebook and Sama both deny the workers' accusations.
A Kenyan court is hearing the case.
Many of the workers came to Kenya from other countries because Sama paid well.
Some earned $429 per month, while workers from other countries, including Nkuzimana, made a little more.
People who follow technology news say the group in Kenya is the most visible because they are also pushing to form a workers' group, or union.
While the case is being decided, however, the workers are not being paid.
Their work permits also have time limits.
Away from home

Many of the workers left their home countries because they heard about good pay.
But they also wanted to leave because of conflicts at home.
Fasica Gebrekidan is from Ethiopia.
She left home for Kenya because she did not want to get caught up in her country's civil war in the northern Tigray region.
She knew of the bad things happening in her country.
When she started working for Sama, she said she saw frightening images and videos.
In order to make a decision about a video, she would have to watch the first 50 seconds and the last 50 seconds.
She would see images of war and rape.
"You run away from the war, then you have to see the war," Fasica said.
"It was just a torture for us."
Many of the moderators said they started the work at Sama with good feelings.
Nkuzimana said he and his co-workers felt like "heroes to the community."
He went on to say that people feel safe looking at Facebook because of workers like him.
He compared the workers to soldiers who might be hurt so everyone else can be safe.
But those good feelings turned bad after hours watching harmful material.
Nkuzimana said he would come home and close himself in his room so he would not have to speak with his family about what he saw that day.
The workers said the U.S.-based Sama did not help the moderators work through what they saw.
The company, however, said mental health professionals were available to all employees.
Sarah Roberts is an expert in content moderation at the University of California, Los Angeles.
She said workers might risk their mental health for a chance to work in technology and make good money.
When companies like Sama are hired to do work for Facebook, Roberts explained, it permits Facebook to say the workers are not its employees.
In addition, she said, the workers are telling "the story of an exploitative industry."
Forever in their heads

Fasica said she is worried she will never be able to have a normal life.
She always sees the images in her head.
She called it "garbage" and worried it will be in her head forever.
She said Facebook should know what is going on with the workers.
"They should care about us," she said.
I'm Dan Friedell. And I'm Caty Weaver.
Dan Friedell adapted this story for Learning English based on a story by The Associated Press.

_____________________________________________________________________

Words in This Story

sue -v. to start a legal action because you think you have been hurt by someone else

compensation -n. another word for payment

visible -adj. well-known